Normalization of input patterns in an associative network.

Authors

  • Andreas Liu
  • Wade G Regehr
Abstract

Numerous brain structures have a cerebellum-like architecture in which inputs diverge onto a large number of granule cells that converge onto principal cells. Plasticity at granule cell-to-principal cell synapses is thought to allow these structures to associate spatially distributed patterns of granule cell activity with appropriate principal cell responses. Storing large sets of associations requires the patterns involved to be normalized, i.e., to have similar total amounts of granule cell activity. Using a general model of associative learning, we describe two ways in which granule cells can be configured to promote normalization. First, we show how heterogeneity in firing thresholds across granule cells can restrict pattern-to-pattern variation in total activity while also limiting spatial overlap between patterns. These effects combine to allow fast and flexible learning. Second, we show that the perceptron learning rule selectively silences those synapses that contribute most to pattern-to-pattern variation in the total input to a principal cell. This provides a simple functional interpretation for the experimental observation that many granule cell-to-Purkinje cell synapses in the cerebellum are silent. Since our model is quite general, these principles may apply to a wide range of associative circuits.
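The two mechanisms described in the abstract can be sketched in a toy simulation. This is not the authors' actual model; the network sizes, connectivity, gain distribution, and learning parameters below are all illustrative assumptions. Part 1 compares a uniform firing threshold against heterogeneous per-cell thresholds and measures pattern-to-pattern variation (coefficient of variation) in total granule-cell activity; part 2 applies a perceptron rule with weights clipped at zero (excitatory synapses) and counts the synapses driven exactly to zero, i.e., silenced.

```python
import numpy as np

# Toy sketch only: hypothetical sizes and parameters, not the paper's model.
rng = np.random.default_rng(0)
n_mossy, n_granule, n_patterns = 50, 500, 200

# Sparse random connectivity: each granule cell samples 4 mossy fibers.
W = np.zeros((n_granule, n_mossy))
for g in range(n_granule):
    W[g, rng.choice(n_mossy, size=4, replace=False)] = 1.0

# Input patterns whose overall strength (gain) varies from pattern to pattern.
gains = rng.uniform(0.5, 1.5, size=(n_patterns, 1))
X = gains * rng.random((n_patterns, n_mossy))
drive = X @ W.T                          # summed excitation per granule cell

def total_activity_cv(thresholds):
    """CV of total granule-cell activity across patterns, plus the patterns."""
    active = drive > thresholds          # binary granule-cell activity patterns
    totals = active.sum(axis=1).astype(float)
    return totals.std() / totals.mean(), active

# Part 1: uniform vs. heterogeneous firing thresholds.
theta_uniform = np.full(n_granule, np.median(drive))
theta_hetero = np.quantile(drive, rng.uniform(0.05, 0.95, size=n_granule))
cv_u, _ = total_activity_cv(theta_uniform)
cv_h, active_h = total_activity_cv(theta_hetero)
print(f"CV of total activity, uniform thresholds:       {cv_u:.3f}")
print(f"CV of total activity, heterogeneous thresholds: {cv_h:.3f}")

# Part 2: perceptron learning with non-negative (excitatory-only) weights.
targets = rng.integers(0, 2, size=n_patterns)    # random binary associations
w = rng.random(n_granule) * 0.1
b, lr = 0.0, 0.01
A = active_h.astype(float)
for _ in range(200):                     # epochs of the clipped perceptron rule
    for p in rng.permutation(n_patterns):
        pred = float(A[p] @ w + b > 0)
        err = targets[p] - pred
        w = np.maximum(w + lr * err * A[p], 0.0)  # clip weights at zero
        b += lr * err
print(f"Fraction of silent (zero-weight) synapses: {(w == 0).mean():.2f}")
```

With this construction, spreading the thresholds flattens the mapping from input gain to the number of active granule cells, so total activity varies less across patterns, and the clipping step pins a substantial fraction of weights exactly at zero, loosely echoing the silent-synapse observation.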




Journal:
  • Journal of Neurophysiology

Volume 111, Issue 3

Pages: -

Publication date: 2014